Impact of language on functional connectivity for audiovisual speech integration
Authors
Abstract
Visual information about lip and facial movements plays a role in audiovisual (AV) speech perception. Although this has been widely confirmed, previous behavioural studies have also shown interlanguage differences: native Japanese speakers do not integrate auditory and visual speech as closely as native English speakers do. To elucidate the neural basis of such interlanguage differences, 22 native English speakers and 24 native Japanese speakers were examined in behavioural and functional magnetic resonance imaging (fMRI) experiments while monosyllabic speech was presented under AV, auditory-only, or visual-only conditions for speech identification. Behavioural results indicated that the English speakers identified visual speech more quickly than the Japanese speakers, and that the temporal facilitation effect of congruent visual speech was significant in the English speakers but not in the Japanese speakers. Using the fMRI data, we examined functional connectivity among brain regions important for auditory-visual interplay. The results indicated that the English speakers had significantly stronger connectivity between the visual motion area MT and Heschl's gyrus than the Japanese speakers, which may subserve lower-level visual influences on speech perception in English speakers in a multisensory environment. These results suggest that linguistic experience strongly affects the neural connectivity involved in AV speech integration.
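The abstract does not spell out how connectivity was quantified. As a minimal sketch only, assuming a simple ROI-to-ROI analysis (Pearson correlation between the MT and Heschl's gyrus time series, Fisher z-transformed, followed by a two-sample t-test across the 22 English and 24 Japanese speakers), the group comparison could look like the Python below; the function name and the synthetic time series are illustrative, not the paper's actual pipeline.

import numpy as np
from scipy import stats

def fisher_z_connectivity(ts_a, ts_b):
    # Pearson correlation between two ROI time series,
    # Fisher z-transformed so values are comparable across participants.
    r, _ = stats.pearsonr(ts_a, ts_b)
    return np.arctanh(r)

rng = np.random.default_rng(0)
# Placeholder BOLD time series (200 volumes per participant); real input
# would be the mean preprocessed signal within the MT and Heschl's gyrus ROIs.
english_z = [fisher_z_connectivity(rng.standard_normal(200),
                                   rng.standard_normal(200)) for _ in range(22)]
japanese_z = [fisher_z_connectivity(rng.standard_normal(200),
                                    rng.standard_normal(200)) for _ in range(24)]

# Between-group comparison of connectivity strength.
t, p = stats.ttest_ind(english_z, japanese_z)
print(f"MT-Heschl connectivity: t = {t:.2f}, p = {p:.3f}")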
Similar articles
Impact of language on audiovisual speech perception examined by fMRI
Both auditory and visual information play an important role in audiovisual speech perception during face-to-face communication. Several behavioral studies have shown that native English speakers and native Japanese speakers behave differently in audiovisual speech perception. We hypothesized that there would be differences in neural processing between native English speakers and native Japan...
Dynamic changes in superior temporal sulcus connectivity during perception of noisy audiovisual speech.
Humans are remarkably adept at understanding speech, even when it is contaminated by noise. Multisensory integration may explain some of this ability: combining independent information from the auditory modality (vocalizations) and the visual modality (mouth movements) reduces noise and increases accuracy. Converging evidence suggests that the superior temporal sulcus (STS) is a critical brain ...
Beta-Band Functional Connectivity Influences Audiovisual Integration in Older Age: An EEG Study
Audiovisual integration occurs frequently and has been shown to exhibit age-related differences via behavior experiments or time-frequency analyses. In the present study, we examined whether functional connectivity influences audiovisual integration during normal aging. Visual, auditory, and audiovisual stimuli were randomly presented peripherally; during this time, participants were asked to r...
Neural development of networks for audiovisual speech comprehension.
Everyday conversation is both an auditory and a visual phenomenon. While visual speech information enhances comprehension for the listener, evidence suggests that the ability to benefit from this information improves with development. A number of brain regions have been implicated in audiovisual speech comprehension, but the extent to which the neurobiological substrate in the child compares to...
Musical expertise is related to altered functional connectivity during audiovisual integration.
The present study investigated the cortical large-scale functional network underpinning audiovisual integration via magnetoencephalographic recordings. The reorganization of this network related to long-term musical training was investigated by comparing musicians to nonmusicians. Connectivity was calculated on the basis of the estimated mutual information of the sources' activity, and the corr...